1 MDP
1) General vocabulary: Multi-Dimensional Processing
2) Medicine: Manic Depressive Psychotic, maximum detrusor pressure
3) American usage: Managed Disability Program
4) Military: Major Defense Program, Mega Damage Point, Ministry Of Defence Police, Multi Destination Protocol, maintenance display panel, malfunction detection package, malicious destruction of property, master display panel, materiel deficiency program, meteorological datum plane, mine development program, missile data processor
6) Mathematics: Markov decision process (Markovian decision process)
7) Law: Maldivian Drug Party
8) Automotive: manifold differential pressure
9) Music: Midi Digital Percussion
10) Politics: Michigan Democratic Party
11) Abbreviation: Mail Design Professional (2004), Mailpiece Design Professional (USPS professional certification program, 2005), Maintenance Data Panel, Ministry of Defence Police (UK), Mnogofunktsionalnyi Dalnyi Pierekhvatchik (multi-function long-range interceptor (Russia)), Modular Display Processor, Multi-Designation Protocol (communications protocol)
12) Computing: Message Driven Processor
13) Transport: Moving Deformable Barrier
14) Business: Major Design Project, Market Development Program, Microsoft Developer Project, Most Discussed Problem, Multi Disciplinary Practice, Multi Disciplinary Project
15) Network technologies: message-driven processor
16) File extension: Project workspace (MS Developer Studio)
17) SAP tech.: master data profile
18) Electricity: main distribution panel
19) Names and surnames: Michael D. Parker
20) NYSE: Meredith Corporation
2 model
1) model (e.g. of an economy)
2) type, make of construction, model (e.g. of a car)
3 system
1) system; way; method
2) arrangement; order
3) classification
4) doctrine
5) network (e.g. of roads)
See also in other dictionaries:
Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… … Wikipedia
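The MDP definition above can be illustrated with value iteration, the standard dynamic-programming solver for MDPs. This is a minimal sketch on a toy 2-state, 2-action problem; the transition probabilities, rewards, and discount factor are invented for illustration, not taken from any real model:

```python
import numpy as np

# Toy MDP: P[s, a, s'] = transition probability, R[s, a] = immediate reward.
# All numbers are made up for illustration.
P = np.array([
    [[0.8, 0.2], [0.1, 0.9]],   # transitions from state 0 under actions 0, 1
    [[0.5, 0.5], [0.0, 1.0]],   # transitions from state 1 under actions 0, 1
])
R = np.array([
    [1.0, 0.0],   # rewards in state 0 for actions 0, 1
    [0.0, 2.0],   # rewards in state 1 for actions 0, 1
])
gamma = 0.9       # discount factor

V = np.zeros(2)
for _ in range(1000):
    # Bellman optimality update:
    # Q(s,a) = R(s,a) + gamma * sum_s' P(s,a,s') V(s')
    Q = R + gamma * np.einsum("sap,p->sa", P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = Q.argmax(axis=1)   # greedy policy w.r.t. the converged values
```

Here action 1 in state 1 pays reward 2 and keeps the agent in state 1 forever, so its value converges to 2 / (1 - gamma) = 20, and the greedy policy chooses action 1 in both states.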
Partially observable Markov decision process — A Partially Observable Markov Decision Process (POMDP) is a generalization of a Markov Decision Process. A POMDP models an agent's decision process in which it is assumed that the system dynamics are determined by an MDP, but the agent cannot… … Wikipedia
Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… … Wikipedia
Pontis — is a software application developed to assist in managing highway bridges and other structures. Pontis stores bridge inspection and inventory data based on the U.S. Federal Highway Administration (FHWA) National Bridge Inventory System (NBIS)… … Wikipedia
Markov chain — A simple two state Markov chain. A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized … Wikipedia
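The Markov chain entry above mentions a simple two-state chain as its example. A minimal sketch: the chain is fully specified by a transition matrix, and its long-run behaviour is the stationary distribution pi solving pi = pi T (the entries of T below are invented for illustration):

```python
import numpy as np

# Two-state Markov chain: T[i, j] = probability of moving from state i to j.
T = np.array([
    [0.9, 0.1],   # from state 0: stay with prob 0.9, switch with prob 0.1
    [0.5, 0.5],   # from state 1: switch with prob 0.5, stay with prob 0.5
])

# The stationary distribution is the left eigenvector of T for eigenvalue 1,
# i.e. the eigenvector of T transposed, normalized to sum to 1.
vals, vecs = np.linalg.eig(T.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
```

For this matrix the balance equation 0.1 * pi[0] = 0.5 * pi[1] gives pi = (5/6, 1/6): the chain spends five sixths of its time in state 0.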
List of mathematics articles (M) — NOTOC M M-estimator M-group M-matrix M-separation M-set M. C. Escher's legacy M. Riesz extension theorem M/M/1 model Maass wave form Mac Lane's planarity criterion Macaulay brackets Macbeath surface MacCormack method Macdonald polynomial Machin… … Wikipedia
Markov property — In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It was named after the Russian mathematician Andrey Markov.[1] A stochastic process has the Markov property if the… … Wikipedia
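The memoryless property described above can be checked empirically: in a simulated chain, the distribution of the next state conditioned on the current state should not depend on how the chain got there. A sketch with an invented 2-state transition matrix and a seeded generator:

```python
import numpy as np

rng = np.random.default_rng(0)
T = np.array([[0.7, 0.3],    # invented transition matrix:
              [0.4, 0.6]])   # T[i, j] = P(next = j | current = i)

# Simulate a long trajectory: the next state depends only on the current one.
n = 200_000
x = np.zeros(n, dtype=int)
u = rng.random(n)
for t in range(1, n):
    x[t] = int(u[t] >= T[x[t - 1], 0])   # go to state 1 with prob T[cur, 1]

# Markov property check: estimate P(next = 1 | current = 0, previous = prev).
# If the process is memoryless, the answer is ~T[0, 1] = 0.3 for either prev.
def cond_freq(prev):
    mask = (x[1:-1] == 0) & (x[:-2] == prev)
    return x[2:][mask].mean()

f0, f1 = cond_freq(0), cond_freq(1)
```

Both conditional frequencies come out near 0.3 regardless of the earlier state, which is exactly what the Markov property asserts.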
probability theory — Math., Statistics. the theory of analyzing and making statements concerning the probability of the occurrence of uncertain events. Cf. probability (def. 4). [1830–40] * * * Branch of mathematics that deals with analysis of random events.… … Universalium